Rejoinder: Boosting Algorithms: Regularization, Prediction and Model Fitting

Authors

  • Peter Bühlmann
  • Torsten Hothorn
Abstract

We are grateful that Hastie points out the connection to degrees of freedom for LARS which leads to another—and often better—definition of degrees of freedom for boosting in generalized linear models. As Hastie writes and as we said in the paper, our formula for degrees of freedom is only an approximation: the cost of searching, for example, for the best variable in componentwise linear least squares or componentwise smoothing splines, is ignored. Hence, our approximation formula
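The approximation the abstract refers to can be made concrete for L2-boosting with componentwise linear least squares: the fitted values after m steps are B_m y with boosting operator B_m = I − ∏_{k=1}^m (I − ν H_{j_k}), and degrees of freedom are approximated by trace(B_m), ignoring the cost of searching for the best component j_k at each step. The following is a minimal illustrative sketch (function name, data handling, and defaults are my own, not the paper's mboost implementation):

```python
import numpy as np

def componentwise_l2_boost(X, y, n_steps=50, nu=0.1):
    """L2-boosting with componentwise linear least squares.

    Returns the coefficients (on standardized predictors) and the
    trace-based degrees-of-freedom approximation df = trace(B_m).
    As noted in the rejoinder, this df ignores the cost of searching
    for the best component at each step.
    """
    n, p = X.shape
    # standardize predictors so each base learner is a rank-one smoother
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    resid = y - y.mean()
    I = np.eye(n)
    B = np.zeros((n, n))      # boosting operator, B_0 = 0
    coef = np.zeros(p)
    for _ in range(n_steps):
        # componentwise least squares fit of each predictor to the residuals
        b = Xs.T @ resid / (Xs ** 2).sum(axis=0)
        # pick the component giving the largest residual-sum-of-squares reduction
        j = np.argmax(b ** 2 * (Xs ** 2).sum(axis=0))
        xj = Xs[:, j:j + 1]
        Hj = xj @ xj.T / (xj ** 2).sum()   # hat matrix of base learner j
        coef[j] += nu * b[j]
        resid = resid - nu * (Hj @ resid)
        # operator recursion: B_m = I - (I - nu * H_{j_m}) (I - B_{m-1})
        B = I - (I - nu * Hj) @ (I - B)
    df = np.trace(B)
    return coef, df
```

Running this on data with one strong predictor shows the familiar behavior: boosting selects that component repeatedly, and trace(B_m) grows slowly with m because of the shrinkage factor ν.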


Related articles

Boosting Algorithms: Regularization, Prediction and Model Fitting

We present a statistical perspective on boosting. Special emphasis is given to estimating potentially complex parametric or nonparametric models, including generalized linear and additive models as well as regression models for survival analysis. Concepts of degrees of freedom and corresponding Akaike or Bayesian information criteria, particularly useful for regularization and variable selectio...


Boosting Algorithms: Regularization, Prediction and Model Fitting, by Peter Bühlmann and Torsten Hothorn

We present a statistical perspective on boosting. Special emphasis is given to estimating potentially complex parametric or nonparametric models, including generalized linear and additive models as well as regression models for survival analysis. Concepts of degrees of freedom and corresponding Akaike or Bayesian information criteria, particularly useful for regularization and variable selectio...


Discussion of “Boosting Algorithms: Regularization, Prediction and Model Fitting” by Peter Bühlmann and Torsten Hothorn

We congratulate the authors (hereafter BH) for an interesting take on the boosting technology, and for developing a modular computational environment in R for exploring their models. Their use of low-degree-of-freedom smoothing splines as a base learner provides an interesting approach to adaptive additive modeling. The notion of “Twin Boosting” is interesting as well; besides the adaptive lass...


Comment: Boosting Algorithms: Regularization, Prediction and Model Fitting

We congratulate the authors (hereafter BH) for an interesting take on the boosting technology, and for developing a modular computational environment in R for exploring their models. Their use of low-degree-of-freedom smoothing splines as a base learner provides an interesting approach to adaptive additive modeling. The notion of “Twin Boosting” is interesting as well; besides the adaptive lasso...


Estimation and regularization techniques for regression models with multidimensional prediction functions

Boosting is one of the most important methods for fitting regression models and building prediction rules. A notable feature of boosting is that the technique can be modified such that it includes a built-in mechanism for shrinking coefficient estimates and variable selection. This regularization mechanism makes boosting a suitable method for analyzing data characterized by small sample sizes a...



Journal:

Volume   Issue 

Pages  -

Publication year: 2008